122 research outputs found

    Human Behavioral Metrics of a Predictive Model Emerging During Robot Assisted Following Without Visual Feedback

    Robot-assisted guiding is gaining increased interest due to the many applications that involve moving in noisy and low-visibility environments. In such cases, haptic feedback is the most effective medium of communication. In this paper, we focus on perturbation-based haptic feedback, motivated by applications such as guide dogs for visually impaired people and potential robotic counterparts that provide haptic feedback via reins to assist indoor firefighting in thick smoke. Since proprioceptive sensors such as spindles and tendons are part of the muscles involved in the perturbation, haptic perception becomes a phenomenon coupled with spontaneous reflex muscle activity. The nature of this interplay, and how model-based sensory-motor integration evolves during haptic guiding, is not yet well understood. In this study, we asked human followers to hold the handle of a hard rein attached to a 1-DoF robotic arm that perturbed the hand to correct the follower's angle error. We found that human followers start with a 2nd-order reactive autoregressive following model and change it to a predictive model with training. Post-perturbation electromyography (EMG) activity exhibited a reduction in muscle co-contraction with training, accompanied by a reduction in the leftward/rightward asymmetry of a set of the followers' behavioural metrics. These results show that model-based prediction accounts for the internal coupling between proprioception and muscle activity during perturbation responses. Furthermore, the results provide a firm foundation and measurement metrics to design and evaluate robot-assisted haptic guiding of humans in low-visibility environments.
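    The 2nd-order autoregressive following model mentioned above can be sketched as a least-squares identification problem. This is a minimal illustration on synthetic data, not the paper's method: the coefficient names (K1, K2), noise levels, and series length are all illustrative assumptions.

```python
import numpy as np

# Sketch: identifying an AR(2) following model from a heading-error series.
# The data are synthetic, generated by a known AR(2) process so the fit
# can be checked; all parameter values are illustrative assumptions.
rng = np.random.default_rng(0)

K1_true, K2_true = 0.6, -0.2          # assumed "true" AR(2) coefficients
theta = np.zeros(200)                 # follower heading-error series
theta[:2] = rng.normal(0, 0.1, 2)
for t in range(2, len(theta)):
    theta[t] = (K1_true * theta[t - 1]
                + K2_true * theta[t - 2]
                + rng.normal(0, 0.01))  # process noise

# Least-squares identification of the AR(2) coefficients:
# theta[t] ~ K1 * theta[t-1] + K2 * theta[t-2]
X = np.column_stack([theta[1:-1], theta[:-2]])
y = theta[2:]
K1_hat, K2_hat = np.linalg.lstsq(X, y, rcond=None)[0]
print(K1_hat, K2_hat)  # estimates near the generating coefficients
```

    The same regression, applied to measured follower responses at different stages of training, is one way the shift from a reactive to a predictive model could be tracked.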

    A Variable Stiffness Robotic Probe for Soft Tissue Palpation

    During abdominal palpation diagnosis, a medical practitioner changes the stiffness of their fingers to improve the detection of hard nodules or abnormalities in soft tissue, maximizing the haptic information gained via the tendons. Our recent experiments using a controllable-stiffness robotic probe representing a human finger also confirmed that such stiffness control in the finger can enhance the accuracy of detecting hard nodules in soft tissue. However, the limited range of stiffness achieved by the antagonistic-spring variable stiffness joint, subject to size constraints, made it unsuitable for the wide range of physical examination scenarios spanning from breast to abdominal examination. In this letter, we present a new robotic probe based on a variable lever mechanism able to achieve stiffness ranging from 0.64 to 1.06 N·m/rad, which extends the maximum stiffness by around 16 times and the stiffness range by 33 times. This letter presents the mechanical model of the novel probe, the finite element simulation, and the experimental characterization of the stiffness response to lever actuation. This work was supported by The United Kingdom Engineering and Physical Sciences Research Council under MOTION Grant EP/N03211X/2.
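    A simplified picture of how a variable lever arm changes joint stiffness: a linear spring of constant k acting at lever arm l about a joint gives an effective rotational stiffness of roughly k·l². The spring constant and lever travel below are illustrative assumptions chosen only so the output spans roughly the 0.64–1.06 N·m/rad interval reported in the abstract; they are not the paper's actual mechanism parameters.

```python
# Sketch: effective rotational stiffness of a joint driven by a linear
# spring through a variable lever arm. All numbers are assumptions for
# illustration, not the probe's real design values.

k_spring = 1060.0               # N/m, assumed linear spring constant
l_min, l_max = 0.0246, 0.0316   # m, assumed lever-arm travel

def joint_stiffness(l):
    """Rotational stiffness (N*m/rad) for lever arm l (m): k * l**2."""
    return k_spring * l ** 2

print(joint_stiffness(l_min))   # lower end of the stiffness range
print(joint_stiffness(l_max))   # upper end of the stiffness range
```

    Because stiffness grows with the square of the lever arm, a modest change in lever position yields a comparatively large change in joint stiffness, which is the appeal of a lever mechanism over antagonistic springs under size constraints.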

    The role of morphology of the thumb in anthropomorphic grasping : a review

    The unique musculoskeletal structure of the human hand gives it wider dexterous capabilities to grasp and manipulate a repertoire of objects than those of non-human primates. It is widely accepted that the orientation and position of the thumb play an important role in this characteristic behavior. There have been numerous attempts to develop anthropomorphic robotic hands, with varying levels of success. Nevertheless, the manipulation ability of those hands needs to be improved even though they can grasp objects successfully. An appropriate model of the thumb is important for manipulating objects against the fingers and maintaining stability. Modeling these complex interactions about the mechanical axes of the joints, and incorporating such joints in robotic thumbs, is a challenging task. This article reviews the biomechanics of the human thumb and existing robotic thumb designs to identify opportunities for future anthropomorphic robotic hands.

    An Optimal State Dependent Haptic Guidance Controller via a Hard Rein

    The aim of this paper is to improve the optimality and accuracy of techniques to guide a human in limited visibility and auditory conditions, such as firefighting in warehouses or similar environments. At present, teams of breathing apparatus (BA)-wearing firefighters move by following walls. Due to limited visibility and high noise in the oxygen masks, they depend predominantly on haptic communication through reins. An intelligent agent (man or machine) with full environmental perception is an alternative to enhance navigation in such unfavorable environments, just as a guide dog leads a blind person. This paper proposes an optimal state-dependent control policy by which an intelligent, environmentally perceptive agent guides a follower with limited environmental perception. Based on experimental systems identification and numerical simulations of human demonstrations from eight pairs of participants, we show that the guiding agent and the follower converge with learning to stable state-dependent control policies: a novel 3rd-order autoregressive predictive policy and a 2nd-order reactive policy, respectively. Our findings provide a novel theoretical basis for designing advanced human-robot interaction algorithms in the variety of cases that require a robot to assist a human counterpart in perceiving the environment.
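    The guider/follower loop described above can be sketched as two coupled difference equations: the guider's rein swing is a 3rd-order autoregressive function of past orientation differences, and the follower reacts with a 2nd-order policy. All coefficients below are illustrative assumptions, not values identified in the paper; the sketch only shows that such a loop can drive the misalignment to zero.

```python
import numpy as np

# Sketch: a 3rd-order AR predictive guider paired with a 2nd-order
# reactive follower. Coefficients are assumed for illustration.
a = [0.5, 0.3, 0.1]   # assumed guider AR(3) weights on past errors
b = [0.6, 0.2]        # assumed follower AR(2) weights on past commands

T = 50
e = np.zeros(T)       # guider-follower orientation difference
u = np.zeros(T)       # guider's lateral rein swing
e[:3] = 1.0           # initial misalignment

for t in range(3, T):
    # predictive guider: command built from the last three orientation errors
    u[t] = a[0] * e[t - 1] + a[1] * e[t - 2] + a[2] * e[t - 3]
    # reactive follower: error shrinks as the follower tracks recent commands
    e[t] = e[t - 1] - (b[0] * u[t] + b[1] * u[t - 1])

print(abs(e[-1]))     # residual misalignment after 50 steps (near zero)
```

    With these coefficients the closed loop is stable and the orientation difference decays, which is the qualitative behavior the identified policies are meant to produce.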

    Disposable soft 3 axis force sensor for biomedical applications


    A two party haptic guidance controller via a hard rein

    In disaster response operations like indoor firefighting, thick smoke, noise in the oxygen masks, and clutter not only limit the environmental perception of the human responders but also cause distress. An intelligent agent (man or machine) with full environmental perception is an alternative to enhance navigation in such unfavorable environments. Since haptic communication is the mode least affected in such cases, we consider human demonstrations in which a hard rein is used to guide blindfolded followers under auditory distraction to be a good paradigm for extracting the salient features of guiding with hard reins. Based on numerical simulations and experimental systems identification of demonstrations from eight pairs of human subjects, we show that the relationship between the orientation difference between the follower and the guider and the lateral swing patterns of the hard rein by the guider can be explained by a novel 3rd-order autoregressive predictive controller. Moreover, by modeling the two-party voluntary movement dynamics with a virtual damped inertial model, we were able to model the mutual trust between the two parties. In the future, the novel controller extracted from human demonstrations can be tested in human-robot interaction scenarios to guide a visually impaired person in applications like firefighting, search and rescue, and medical surgery.
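    The "virtual damped inertial model" idea can be illustrated as fitting a mass-damper law m·x″ + c·x′ = F to motion data. The mass, damping, and force profile below are illustrative assumptions; how the fitted parameters relate to mutual trust is the paper's contribution and is not reproduced here.

```python
import numpy as np

# Sketch: simulate a virtual damped inertial model m*x'' + c*x' = F,
# then recover m and c from the trajectory by least squares. All
# parameter values are assumptions for illustration.
m_true, c_true = 2.0, 5.0   # assumed virtual mass (kg) and damping (N*s/m)
dt = 0.01
T = 400
F = np.sin(0.5 * np.arange(T) * dt)   # assumed rein-force profile (N)

# Integrate with semi-implicit Euler
x = np.zeros(T)
v = np.zeros(T)
for t in range(1, T):
    acc_prev = (F[t - 1] - c_true * v[t - 1]) / m_true
    v[t] = v[t - 1] + acc_prev * dt
    x[t] = x[t - 1] + v[t] * dt

# Recover m and c: regress F on acceleration and velocity
acc = np.gradient(v, dt)
A = np.column_stack([acc, v])
m_hat, c_hat = np.linalg.lstsq(A, F, rcond=None)[0]
print(m_hat, c_hat)   # estimates near the generating m and c
```

    Applied to measured follower motion and rein force, the same regression yields per-trial virtual mass and damping, which is the kind of quantity the abstract links to mutual trust.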

    Human-Aware Robot Navigation by Behavioral Metrics Based on Predictive Model

    Human-aware robot navigation is important in many applications in human-robot shared environments. In some situations, people have to move with reduced visual and auditory perception; a robot can then enhance the efficiency of navigation in noisy, low-visibility conditions. In that scenario, haptics is the best way to communicate when other modalities are less reliable. We used a rein through which a 1-DoF robotic arm perturbs the human's arm to guide them to a desired point. The novelty of our work is presenting behavioral metrics, based on a novel predictive model, to strategically position humans in a human-robot shared environment under low visibility and auditory conditions. We found that humans start with a second-order reactive autoregressive following model and change it to a predictive model with training. This result will help us enhance humans' safety and comfort in robot-led navigation in shared environments.

    Wearable Haptic Based Pattern Feedback Sleeve System

    This paper presents how humans trained in primitive haptic patterns using a wearable sleeve can recognize scaled and shifted versions of those patterns. The wearable sleeve consists of 7 vibro-actuators that stimulate the subject's arm to convey the primitive haptic patterns. The primitive haptic patterns used are the Gaussian template (T), shifted right (R), shifted left (L), half Gaussian (H), and shrunk (S), hereafter denoted by templates. The results of this paper give an idea of how humans mentally construct cutaneous feedback in scenarios such as shifting and scaling with respect to trained patterns, and how they recognize all trained patterns when played randomly. These insights will help to develop more efficient haptic feedback systems in which a small number of learnt templates encode complex haptic messages. Therefore, the results provide new insights and design guidelines/algorithms for conveying messages encoded in vibro-tactile actuator arrays, especially in scenarios where vision and audition are less reliable, such as search and rescue or factories. For example, the results could be used to convey a message that gives the human an idea of the shape and stiffness of obstacles that come into contact with the robot during haptic guiding in low-visibility conditions in human-robot interactions.
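    The five templates described above can be rendered as intensity levels across the 7 actuators. The centres, widths, and shift amounts below are illustrative assumptions, not the parameters used in the study; the sketch only shows how shifting and scaling a single Gaussian generates the whole template set.

```python
import math

# Sketch: the five haptic templates (T, R, L, H, S) as intensity levels
# for a 7-actuator sleeve. All centres/widths are assumptions.
N = 7  # actuators along the arm

def gaussian(centre, width):
    """Intensity profile of a Gaussian bump over the N actuators."""
    return [math.exp(-((i - centre) ** 2) / (2 * width ** 2))
            for i in range(N)]

templates = {
    "T": gaussian(centre=3, width=1.0),     # Gaussian template
    "R": gaussian(centre=5, width=1.0),     # shifted right
    "L": gaussian(centre=1, width=1.0),     # shifted left
    "H": [g if i <= 3 else 0.0              # half Gaussian (right side off)
          for i, g in enumerate(gaussian(3, 1.0))],
    "S": gaussian(centre=3, width=0.5),     # shrunk (narrower) Gaussian
}

for name, pattern in templates.items():
    print(name, [round(p, 2) for p in pattern])
```

    Driving the actuators with these profiles (scaled to each actuator's duty-cycle range) is one plausible way such templates could be played on the sleeve.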